Second-order noiseless source coding theorems

Authors

I. Kontoyiannis

Abstract

Shannon's celebrated source coding theorem can be viewed as a "one-sided law of large numbers." We formulate second-order noiseless source coding theorems for the deviation of the codeword lengths from the entropy. For a class of sources that includes Markov chains we prove a "one-sided central limit ...

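To make the flavor of such a second-order statement concrete, here is a hedged sketch in notation assumed for this note (not quoted from the paper): let L_n denote the length, in bits, of the codeword assigned to the first n source symbols, and let H be the entropy rate of the source. Shannon's theorem gives L_n / n → H; a one-sided central limit theorem for the deviation L_n − nH says, roughly, that for every admissible code

\[
  \limsup_{n \to \infty} \Pr\!\left[ \frac{L_n - nH}{\sqrt{n}} \le x \right] \;\le\; \Phi\!\left(\frac{x}{\sigma}\right) \qquad \text{for all } x,
\]

where Φ is the standard normal distribution function and σ > 0 is a source-dependent constant, with codes existing that attain equality. Informally: codeword lengths cannot fall below nH more often than Gaussian fluctuations of size σ√n allow, and the best codes have exactly Gaussian fluctuations on that scale.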

Similar resources


Noiseless Coding Theorems Corresponding to Fuzzy Entropies

Noiseless coding theorems connected with fuzzy entropies corresponding to Shannon, Rényi, and Havrda and Charvát have been established. Upper bounds on these entropies in terms of mean codeword lengths have been provided, and some interesting properties of these codeword lengths have been studied.

Full text
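As a point of reference for the bounds described in the entry above, the classical Shannon-entropy benchmark that such fuzzy-entropy results generalize can be stated as follows (standard notation assumed here): for source probabilities p_1, …, p_m and a uniquely decodable binary code with codeword lengths ℓ_1, …, ℓ_m, the mean codeword length satisfies

\[
  H(P) \;=\; -\sum_{i=1}^{m} p_i \log_2 p_i \;\le\; \bar{L} \;=\; \sum_{i=1}^{m} p_i \,\ell_i ,
\]

and an optimal code achieves \bar{L} < H(P) + 1. The theorems referred to above establish bounds of this shape with H replaced by fuzzy analogues of the Shannon, Rényi, and Havrda–Charvát entropies.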

Fair Noiseless Broadcast Source Coding

We present a noiseless source coding problem in a broadcast environment and supply a simple solution to it. A transmitter wishes to transmit a binary random vector to a number of receivers, where each receiver is interested only in one component of the vector. A source encoding is a binary sequence chosen by the transmitter. The expected time at which a given receiver determines its component (with probability one) is denoted...

Full text

Universal noiseless coding

Universal coding is any asymptotically optimum method of block-to-block memoryless source coding for sources with unknown parameters. This paper considers noiseless coding for such sources, primarily in terms of variable-length coding, with performance measured as a function of the coding redundancy relative to the per-letter conditional source entropy given the unknown parameter. It is...

Full text
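As a reading aid for the redundancy criterion mentioned in the entry above, here is one common way to formalize it (notation assumed here, not taken from the paper): if the source statistics depend on an unknown parameter θ and ℓ(X^n) is the length of the codeword assigned to a block X^n = (X_1, …, X_n), the per-letter coding redundancy relative to the conditional source entropy given θ is

\[
  r_n(\theta) \;=\; \frac{1}{n}\,\mathbb{E}_\theta\!\left[\ell(X^n)\right] \;-\; H_\theta ,
\]

where H_θ is the per-letter source entropy under the parameter value θ. A coding scheme is universal in this sense when r_n(θ) → 0 as n → ∞, suitably uniformly in the unknown θ.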

Shannon's noiseless coding theorem

In these notes we discuss Shannon's noiseless coding theorem, which is one of the founding results of the field of information theory. Roughly speaking, we want to answer questions such as: how much information is contained in some piece of data? One way to approach this question is to say that the data contains n bits of information (on average) if it can be coded by a binary sequence of length...

Full text
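For concreteness, a standard form of the theorem these notes discuss, sketched here for a memoryless source (this statement is standard but not quoted from the notes): if X is a source symbol with entropy H(X), then every uniquely decodable binary code has expected codeword length at least H(X), and there exist codes coming within one bit of it:

\[
  H(X) \;\le\; \mathbb{E}\!\left[\ell(X)\right] \;<\; H(X) + 1 .
\]

Encoding blocks of n symbols at a time and normalizing per symbol shrinks the one-bit overhead to 1/n, so the optimal per-symbol code length converges to the entropy, which is the precise sense in which the data "contains H(X) bits of information per symbol".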


Journal

Journal title: IEEE Transactions on Information Theory

Volume: 43, pp. 1339–1341 (July 1997)

Year: 1997

ISSN: 0018-9448

DOI: 10.1109/18.605604